# Threat Intelligence Analysis

## Foundation Sec 8B Instruct

Foundation-Sec-8B-Instruct is an open-source 8-billion-parameter foundational language model designed specifically for cybersecurity applications, built by extending Llama-3.1-8B.

- **License:** Apache-2.0
- **Tags:** Large Language Model · Transformers · English
- **Author:** 2p8xx · **Downloads:** 255 · **Likes:** 3
## Foundation Sec 8B

Foundation-Sec-8B is an 8-billion-parameter foundational language model designed specifically for cybersecurity. Built by extending Llama-3.1-8B, it is suited to security tasks such as threat detection and vulnerability assessment.

- **License:** Apache-2.0
- **Tags:** Large Language Model · Transformers · English
- **Author:** fdtn-ai · **Downloads:** 25.24k · **Likes:** 168
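As a hedged sketch of how such a checkpoint is typically queried: the snippet below assumes the Hugging Face model id `fdtn-ai/Foundation-Sec-8B` (inferred from the author and name above) and the standard `transformers` causal-LM interface; the `build_prompt` helper is hypothetical.

```python
# Sketch: querying Foundation-Sec-8B via Hugging Face transformers.
# MODEL_ID is an assumption based on the author/name listed above.
MODEL_ID = "fdtn-ai/Foundation-Sec-8B"

def build_prompt(cve_id: str) -> str:
    """Format a simple threat-intel question (hypothetical helper)."""
    return f"Summarize the risk posed by {cve_id} in one paragraph.\n"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    # Heavy imports live inside the function so the sketch can be read
    # (and the helpers reused) without downloading the 8B checkpoint.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)
    inputs = tok(prompt, return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tok.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate(build_prompt("CVE-2021-44228")))
```

Since this is a base (non-instruct) model, plain text completion prompts like the one above tend to work better than chat-style templates.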
## Llama Primus Nemotron 70B Base

A cybersecurity large language model continually trained from nvidia/Llama-3.1-Nemotron-70B-Instruct, achieving an 11.19% improvement in aggregate scores across multiple cybersecurity benchmarks.

- **License:** MIT
- **Tags:** Large Language Model · Transformers · Multilingual
- **Author:** trend-cybertron · **Downloads:** 63 · **Likes:** 4
## Llama Primus Base

A specialized large language model for cybersecurity, pre-trained from Llama-3.1-8B-Instruct, achieving a 15.88% improvement in aggregate cybersecurity benchmark scores.

- **License:** MIT
- **Tags:** Large Language Model · Transformers · English
- **Author:** trendmicro-ailab · **Downloads:** 58 · **Likes:** 10
## SenecaLLM X Qwen2.5 7B CyberSecurity Q8_0 GGUF

A large language model for the cybersecurity domain based on the Qwen2.5-7B architecture, converted to GGUF format for use with llama.cpp.

- **License:** MIT
- **Tags:** Large Language Model · English
- **Author:** Nekuromento · **Downloads:** 18 · **Likes:** 1
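Because the checkpoint ships as GGUF, it can be run with llama.cpp or its Python bindings. A minimal sketch, assuming the `llama-cpp-python` package and a locally downloaded file (the filename below is an assumption, not the repository's actual filename):

```python
# Sketch: running a Q8_0 GGUF checkpoint with llama-cpp-python.
# The local path is assumed; download the .gguf file from the repo first.
GGUF_PATH = "senecallm-x-qwen2.5-7b-cybersecurity-q8_0.gguf"

def ask(question: str, path: str = GGUF_PATH) -> str:
    # Import inside the function so the sketch is readable without
    # llama-cpp-python installed.
    from llama_cpp import Llama

    llm = Llama(model_path=path, n_ctx=4096)
    # Simple Q/A completion format; stop before the model invents a new question.
    result = llm(f"Q: {question}\nA:", max_tokens=256, stop=["Q:"])
    return result["choices"][0]["text"].strip()

if __name__ == "__main__":
    print(ask("What is lateral movement in an intrusion?"))
```

Q8_0 is a near-lossless 8-bit quantization, so a 7B model at this level needs roughly 8 GB of RAM/VRAM.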
## CyNER 2.0 DeBERTa V3 Base

CyNER 2.0 is a named entity recognition model designed specifically for the cybersecurity domain. Based on the DeBERTa architecture, it identifies a range of cybersecurity-related entity types.

- **License:** MIT
- **Tags:** Sequence Labeling · Transformers · English
- **Author:** PranavaKailash · **Downloads:** 164 · **Likes:** 2
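A hedged sketch of how a token-classification model like this is typically used, assuming the Hugging Face id `PranavaKailash/CyNER-2.0-DeBERTa-v3-base` (inferred from the author and name above) and the standard `transformers` pipeline API:

```python
# Sketch: cybersecurity NER with a CyNER 2.0 checkpoint.
# MODEL_ID is an assumption based on the author/name listed above.
MODEL_ID = "PranavaKailash/CyNER-2.0-DeBERTa-v3-base"

def extract_entities(text: str):
    # Import inside the function so the sketch is readable without transformers.
    from transformers import pipeline

    ner = pipeline("token-classification", model=MODEL_ID,
                   aggregation_strategy="simple")
    # Each result dict carries entity_group, word, score, start, end;
    # keep only the label and surface form here.
    return [(e["entity_group"], e["word"]) for e in ner(text)]

if __name__ == "__main__":
    print(extract_entities("APT29 used Cobalt Strike to exploit CVE-2023-23397."))
```

`aggregation_strategy="simple"` merges sub-word pieces back into whole entity spans, which matters for indicators like CVE ids that tokenizers split aggressively.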
## Lily Cybersecurity 7B V0.2 8.0bpw H8 Exl2

Lily is a cybersecurity assistant fine-tuned from Mistral-7B, specializing in cybersecurity and hacking techniques.

- **License:** Apache-2.0
- **Tags:** Large Language Model · Transformers · English
- **Author:** LoneStriker · **Downloads:** 72 · **Likes:** 4
## CyBERT CyNER

A cybersecurity entity recognition model based on CyBERT and fine-tuned on the CyNER dataset, used to identify named entities related to cyber threats.

- **Tags:** Sequence Labeling · Transformers
- **Author:** Cyber-ThreaD · **Downloads:** 31 · **Likes:** 1
## ATT&CK BERT

ATT&CK BERT is a cybersecurity-specific language model built on sentence-transformers that maps sentences describing attack behavior to semantically meaningful embedding vectors.

- **Tags:** Text Embedding · Transformers
- **Author:** basel · **Downloads:** 11.79k · **Likes:** 14
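A minimal sketch of comparing two attack descriptions by embedding similarity, assuming the Hugging Face id `basel/ATTACK-BERT` (inferred from the author and name above) and the standard `sentence-transformers` API; the plain-Python `cosine` helper is our own:

```python
# Sketch: embedding attack descriptions with ATT&CK BERT and comparing them.
# MODEL_ID is an assumption based on the author/name listed above.
MODEL_ID = "basel/ATTACK-BERT"

def cosine(a, b) -> float:
    """Plain-Python cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb)

def similarity(sent_a: str, sent_b: str) -> float:
    # Import inside the function so the sketch is readable without the library.
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer(MODEL_ID)
    va, vb = model.encode([sent_a, sent_b])
    return cosine(va.tolist(), vb.tolist())

if __name__ == "__main__":
    # Two phrasings of the same credential-access behavior should score high.
    print(similarity("The malware dumps credentials from LSASS.",
                     "Adversaries harvest passwords from OS memory."))
```

This pairwise scoring is the usual building block for mapping free-text threat reports onto ATT&CK technique descriptions via nearest-neighbor search.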
## SecBERT

SecBERT is a language model pre-trained on cybersecurity text and optimized for tasks in the cybersecurity domain.

- **License:** Apache-2.0
- **Tags:** Large Language Model · Transformers · English
- **Author:** jackaduma · **Downloads:** 40.03k · **Likes:** 52
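As a hedged sketch of probing what a domain pre-trained encoder has learned: the snippet assumes the Hugging Face id `jackaduma/SecBERT` (inferred from the author and name above), the `transformers` fill-mask pipeline, and a BERT-style `[MASK]` placeholder.

```python
# Sketch: querying SecBERT's masked-language head on security text.
# MODEL_ID is an assumption based on the author/name listed above.
MODEL_ID = "jackaduma/SecBERT"

def top_fills(template: str, k: int = 5):
    # Import inside the function so the sketch is readable without transformers.
    from transformers import pipeline

    fill = pipeline("fill-mask", model=MODEL_ID, top_k=k)
    # The template must contain the tokenizer's mask placeholder,
    # e.g. "[MASK]" for BERT-style vocabularies (assumed here).
    return [(r["token_str"], round(r["score"], 3)) for r in fill(template)]

if __name__ == "__main__":
    print(top_fills("The attacker performed a [MASK] injection against the web server."))
```

A security-domain model should rank tokens like "SQL" or "command" far higher here than a general-purpose BERT would.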
## SecRoBERTa

SecRoBERTa is a language model pre-trained on cybersecurity text and optimized for tasks in the cybersecurity domain.

- **License:** Apache-2.0
- **Tags:** Large Language Model · Transformers · English
- **Author:** jackaduma · **Downloads:** 16.75k · **Likes:** 18
© 2025 AIbase